Exploiting tensor rank-one decomposition in probabilistic inference
Authors
Abstract
We propose a new additive decomposition of probability tables: tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Table entries are allowed to be arbitrary real numbers, i.e. they may also be negative. Allowing negative numbers, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that tensor rank-one decomposition can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm for cases in which the closed-form solution is not known.
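As an illustration of the decomposition described above, the following sketch (assuming Python with NumPy; the OR-gate table and all names are illustrative choices, not taken from the paper) writes the conditional probability table of a deterministic OR gate, P(y | x1, ..., xk) with y = x1 OR ... OR xk, as the sum of two rank-one tensors, one of which contains a negative entry.

    import numpy as np
    from functools import reduce

    k = 3  # number of parent variables (an illustrative choice)

    # Exact CPT as a (2, ..., 2, 2) tensor: T[x1, ..., xk, y] = P(y | x1, ..., xk).
    T = np.zeros((2,) * (k + 1))
    for x in np.ndindex(*((2,) * k)):
        y = int(any(x))            # deterministic OR of the parents
        T[x + (y,)] = 1.0

    def rank_one(vectors):
        # Outer product of one-dimensional tables, i.e. a rank-one tensor.
        return reduce(np.multiply.outer, vectors)

    # Term 1: every parent factor is the all-ones vector; the child factor puts all mass on y = 1.
    term1 = rank_one([np.ones(2)] * k + [np.array([0.0, 1.0])])
    # Term 2: parent factors are indicators of x_i = 0; the child factor (1, -1) uses a
    # negative entry to move the mass from y = 1 back to y = 0 when all parents are 0.
    term2 = rank_one([np.array([1.0, 0.0])] * k + [np.array([1.0, -1.0])])

    assert np.allclose(T, term1 + term2)

    # Storage: the full table needs 2 ** (k + 1) numbers, the two-term additive form
    # only 2 * 2 * (k + 1); for k = 10 that is 2048 versus 44 numbers.

During inference such a sum can be processed term by term, which is the source of the space and time savings the abstract refers to.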
Similar resources
An Approximate Tensor-Based Inference Method Applied to the Game of Minesweeper
We propose an approximate probabilistic inference method based on the CP tensor decomposition and apply it to the well-known computer game of Minesweeper. In the method we view the conditional probability tables of the exactly-l-out-of-k functions as tensors and approximate them by a sum of rank-one tensors. The number of summands is min{l + 1, k − l + 1}, which is lower than their exact symmetr... (a worked sketch of this tensor view appears after this list of similar resources)
Probabilistic inference with noisy-threshold models based on a CP tensor decomposition
The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease that difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. When using the standard inference techniques, the inference complexity is exponen...
Tensor rank-one decomposition of probability tables
We propose a new additive decomposition of probability tables: tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in tables ...
Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition
Conditional probability tables (CPTs) of threshold functions represent a generalization of two popular models, noisy-or and noisy-and. They constitute an alternative to these two models in cases where they are too rough. When using the standard inference techniques, the inference complexity is exponential with respect to the number of parents of a variable. In case the CPTs take a special form (in thi...
Survey on Probabilistic Models of Low-Rank Matrix Factorizations
Low-rank matrix factorizations such as Principal Component Analysis (PCA), Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NMF) are a large class of methods for pursuing the low-rank approximation of a given data matrix. The conventional factorization models are based on the assumption that the data matrices are contaminated stochastically by some type of noise. Thus t...
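The Minesweeper and noisy-threshold items above view the CPT of an exactly-l-out-of-k function as a tensor and replace it by a small sum of rank-one terms. The sketch below (Python with NumPy assumed; it uses a generic interpolation construction with k + 1 exact terms, not the approximate min{l + 1, k − l + 1}-term decomposition of those papers) only makes concrete what such a rank-one expansion of an exactly-l-out-of-k table looks like.

    import numpy as np
    from functools import reduce

    k, l = 5, 2   # illustrative sizes: 5 binary parents, "exactly 2 out of 5"

    # Indicator tensor of the exactly-l-out-of-k function:
    # T[x1, ..., xk] = 1 if x1 + ... + xk == l, else 0.
    T = np.zeros((2,) * k)
    for x in np.ndindex(*((2,) * k)):
        T[x] = 1.0 if sum(x) == l else 0.0

    # A rank-one term whose every factor is the vector (1, t) contributes t ** s to each
    # cell whose parents sum to s, so any function of s can be matched by solving a
    # Vandermonde system for real (possibly negative) weights c_j over k + 1 distinct nodes t_j.
    t = np.linspace(-1.0, 1.0, k + 1)              # distinct interpolation nodes
    V = np.vander(t, k + 1, increasing=True).T     # V[s, j] = t_j ** s
    f = np.array([1.0 if s == l else 0.0 for s in range(k + 1)])
    c = np.linalg.solve(V, f)

    def rank_one(vectors):
        # Outer product of one-dimensional tables, i.e. a rank-one tensor.
        return reduce(np.multiply.outer, vectors)

    approx = sum(c[j] * rank_one([np.array([1.0, t[j]])] * k) for j in range(k + 1))
    assert np.allclose(T, approx)   # k + 1 rank-one terms reproduce the table exactly

The cited papers trade exactness for fewer summands; the point here is only that real, possibly negative, weights suffice for an exact additive expansion.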
Journal: Kybernetika
Volume: 43
Issue: -
Pages: -
Publication date: 2007